Trident Consulting is seeking a Data Architect for one of our clients in Parsippany, NJ (onsite), a global leader in business and technology services.
Job Title: Data Architect
Job Location: Parsippany, NJ (Onsite)
Job Type: Contract
Required Skills: Python, Amazon Redshift, Amazon S3, Data Architect, Data Modelling, DB Performance Optimization
Job Summary:
We are seeking a highly skilled Sr. Architect with 12 to 15 years of experience to join our team.
The ideal candidate will have extensive experience with cloud data pipelines, along with strong architecture and data modelling skills.
Must Haves:
- Experience in AWS and enterprise data warehousing/ETL projects (building ETL pipelines), as well as enterprise data engineering and analytics projects.
- Data Modelling design (ER/Dimensional Modelling) - Conceptual/Logical/Physical.
- Clear understanding of data warehousing and data lake concepts.
- Hands-on AWS experience with Redshift implementation.
- Ability to understand business requirements and existing system designs, enterprise applications, IT security guidelines, and legal protocols.
- Data modelling experience and the ability to collaborate with other teams within the project/program.
- Proven experience in data modelling and analysis, data migration strategy, cleansing and migrating large master data sets, data alignment across multiple applications, and data governance.
- Ability to assist in making technology choices and decisions in an enterprise architecture scenario.
- Working experience with different database environments/applications such as OLTP and OLAP.
- Design, build, and operationalize data solutions and applications using one or more AWS data and analytics services (EMR, Redshift, Kinesis, Glue) in combination with third-party tools.
- Actively participate in optimization and performance tuning of data ingestion and SQL processes.
- Knowledge of basic AWS services such as S3 and EC2.
- Experience with any of the following: AWS Athena, Glue (PySpark), EMR, Redshift.
- Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala.
- Design and implement data engineering, ingestion, and curation functions on the AWS cloud using AWS-native services or custom programming.
- Analyze, re-architect, and re-platform on-premises data warehouses to data platforms on the AWS cloud using AWS or third-party services.
- Understand and implement security and version controls.
- Support data engineers with ETL process design, code reviews, and knowledge sharing.
Roles and Responsibilities
- Ability to explain data lake architecture using AWS services
- In-depth knowledge of the AWS Well-Architected Framework
- Good programming skills using a scripting language (e.g., Python)
- Good to have: experience with at least one ETL tool
- Clarify and finalize detailed scope for migration
- Conduct customer interviews to understand existing standards, policies, quality compliance, and enterprise metadata standards
- Work with various SMEs to understand business process flows, functional requirements specifications of existing systems, current challenges and constraints, and future expectations
- Document the current state and prepare the target-state architecture
- Excellent client interfacing and communication skills
- Very good understanding of data intelligence concepts, technologies, etc.
- Address customer issues with speed and efficiency
- Develop and manage relations with key client stakeholders
- Identify resources required for project completion
About Trident:
Trident Consulting is an award-winning IT/engineering staffing company founded in 2005 and headquartered in San Ramon, CA. We specialize in placing high-quality vetted technology and engineering professionals in contract and full-time roles. Trident's commitment is to deliver the best and brightest individuals in the industry for our clients' toughest requirements.
Some of our recent awards include:
• 2022, 2021, 2020 Inc. 5000 fastest-growing private companies in America
• 2022, 2021 SF Business Times 100 fastest-growing private companies in Bay Area